
    Neural correlates of visualizations of concrete and abstract words in preschool children: A developmental embodied approach

    The neural correlates of visualization underlying word comprehension were examined in preschool children. On each trial, a concrete or abstract word was delivered binaurally (part 1: post-auditory visualization), followed by a four-picture array (a target plus three distractors; part 2: matching visualization). Children were asked to select the picture matching the word they had heard in part 1. Event-related potentials (ERPs) locked to each stimulus presentation and task interval were averaged over sets of trials of increasing word abstractness. The ERP time-course during both parts of the task showed that early activity (i.e., <300 ms) was predominant in response to concrete words, while activity in response to abstract words became evident only at intermediate (i.e., 300-699 ms) and late (i.e., 700-1000 ms) ERP intervals. Specifically, ERP topography showed that while early activity during post-auditory visualization was linked to left temporo-parietal areas for concrete words, early activity during matching visualization occurred mostly in occipito-parietal areas for concrete words but more anteriorly, in centro-parietal areas, for abstract words. In intermediate ERPs, post-auditory visualization coincided with parieto-occipital and parieto-frontal activity in response to both concrete and abstract words, while in matching visualization a parieto-central activity was common to both types of words. In the late ERPs for both types of words, post-auditory visualization involved right-hemispheric activity following a posterior-to-anterior pathway sequence (occipital, parietal, and temporal areas); conversely, matching visualization involved left-hemispheric activity following an anterior-to-posterior pathway sequence (frontal, temporal, parietal, and occipital areas). These results suggest that, similarly for concrete and abstract words, meaning in young children depends on variably complex visualization processes that integrate visuo-auditory experiences and supramodal embodied representations.
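
    A minimal sketch of the windowed ERP averaging described above, using the latency windows from the abstract (early <300 ms, intermediate 300-699 ms, late 700-1000 ms). The array layout, sampling rate, and trial masks are assumptions for illustration, not details taken from the paper.

    import numpy as np

    SFREQ = 500  # sampling rate in Hz (assumed, not reported here)

    # Latency windows taken from the abstract.
    WINDOWS_MS = {"early": (0, 300), "intermediate": (300, 700), "late": (700, 1000)}

    def window_means(epochs, sfreq=SFREQ):
        """Average ERP amplitude per channel within each latency window.

        epochs: NumPy array (n_trials, n_channels, n_samples),
        time-locked to stimulus onset (assumed layout).
        """
        erp = epochs.mean(axis=0)  # average over trials -> (n_channels, n_samples)
        means = {}
        for name, (start_ms, end_ms) in WINDOWS_MS.items():
            start, end = int(start_ms * sfreq / 1000), int(end_ms * sfreq / 1000)
            means[name] = erp[:, start:end].mean(axis=1)  # mean over time points
        return means

    # Hypothetical usage: compare trial subsets of increasing word abstractness.
    # concrete_means = window_means(epochs[concrete_trials])
    # abstract_means = window_means(epochs[abstract_trials])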

    A Multi-Lab Test of the Facial Feedback Hypothesis by the Many Smiles Collaboration

    Following theories of emotional embodiment, the facial feedback hypothesis suggests that individuals’ subjective experiences of emotion are influenced by their facial expressions. However, evidence for this hypothesis has been mixed. We thus formed a global adversarial collaboration and carried out a preregistered, multicentre study designed to specify and test the conditions that should most reliably produce facial feedback effects. Data from n = 3,878 participants spanning 19 countries indicated that a facial mimicry and voluntary facial action task could both amplify and initiate feelings of happiness. However, evidence of facial feedback effects was less conclusive when facial feedback was manipulated unobtrusively via a pen-in-mouth task.

    Appraisal of space words and allocation of emotion words in bodily space

    The body-specificity hypothesis (BSH) predicts that right-handers and left-handers allocate positive and negative concepts differently on the horizontal plane: while left-handers allocate negative concepts to the right-hand side of their bodily space, right-handers allocate such concepts to the left-hand side. Similar research shows that people in general tend to allocate positive and negative concepts to upper and lower areas, respectively, on the vertical plane. Further research shows a higher salience of the vertical plane over the horizontal plane in the performance of sensorimotor tasks. The aim of this paper is to examine whether the dominance of the vertical plane over the horizontal plane holds not only at a sensorimotor level but also at a conceptual level. In Experiment 1, participants from diverse linguistic backgrounds were asked to rate the words “up”, “down”, “left”, and “right”. In Experiment 2, right-handed participants from two linguistic backgrounds were asked to allocate emotion words into a square grid divided into four boxes of equal area. Results suggest that the vertical plane is more salient than the horizontal plane in the allocation of emotion words: positively valenced words were placed in upper locations, whereas negatively valenced words were placed in lower locations. Together, the results lend support to the BSH while also suggesting a higher saliency of the vertical plane over the horizontal plane in the allocation of valenced words.
    Fernando Marmolejo-Ramos, María Rosa Elosúa, Yuki Yamada, Nicholas Francis Hamm and Kimihiro Noguchi
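
    A toy sketch, on made-up data, of the grid-allocation comparison the abstract reports: tally how often each word lands on the side of the 2x2 grid predicted for its valence along the vertical versus the horizontal axis. All words, placements, and function names below are hypothetical.

    # Hypothetical placements for illustration: word -> (valence, grid box),
    # with boxes "upper-left", "upper-right", "lower-left", "lower-right".
    placements = {
        "joy": ("positive", "upper-right"),
        "hope": ("positive", "upper-left"),
        "grief": ("negative", "lower-left"),
        "fear": ("negative", "lower-right"),
    }

    def axis_agreement(placements, axis):
        """Proportion of words placed on the side predicted for their valence:
        vertical axis -> positive = upper, negative = lower;
        horizontal axis -> positive = right, negative = left (BSH, right-handers)."""
        hits = 0
        for valence, box in placements.values():
            vertical, horizontal = box.split("-")
            if axis == "vertical":
                hits += (valence == "positive") == (vertical == "upper")
            else:
                hits += (valence == "positive") == (horizontal == "right")
        return hits / len(placements)

    print("vertical agreement:", axis_agreement(placements, "vertical"))      # 1.0 here
    print("horizontal agreement:", axis_agreement(placements, "horizontal"))  # 0.5 here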

    Words that move us: The effects of sentences on body sway

    According to the embodied cognition perspective, cognitive systems and perceptuo-motor systems are deeply intertwined and exert a causal effect on each other. A prediction following from this idea is that cognitive activity can result in subtle changes in observable movement. In one experiment, we tested whether reading various sentences resulted in changes in postural sway. The sentences symbolized various human activities involving high, low, or no physical effort. Dutch participants stood upright on a force plate, which measured the body's centre of pressure, while reading a succession of sentences. High physical effort sentences resulted in more postural sway (greater SD) than low physical effort sentences. This effect showed up only in medio-lateral sway, not in anterior-posterior sway. This suggests that sentence comprehension was accompanied by subtle motoric activity, likely mirroring the various activities symbolized in the sentences. We conclude that semantic processing reaches the motor periphery, leading to increased postural activity.
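
    A minimal sketch of the sway measure mentioned above (the SD of the centre-of-pressure signal), computed separately for the medio-lateral and anterior-posterior directions; the array layout and variable names are assumptions, not taken from the paper.

    import numpy as np

    def sway_sd(cop_xy):
        """Standard deviation of the centre-of-pressure (COP) signal.

        cop_xy: NumPy array (n_samples, 2); column 0 = medio-lateral,
        column 1 = anterior-posterior (assumed layout and units).
        """
        return {
            "medio_lateral_sd": float(np.std(cop_xy[:, 0], ddof=1)),
            "anterior_posterior_sd": float(np.std(cop_xy[:, 1], ddof=1)),
        }

    # Hypothetical usage: compare sway recorded while reading high- vs
    # low-effort sentences.
    # print(sway_sd(cop_high_effort))
    # print(sway_sd(cop_low_effort))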

    Event-related potential signatures of perceived and imagined emotional and food real-life photos

    Although food and affective pictures share similar emotional and motivational characteristics, the relationship between the neuronal responses to these stimuli is unclear. In particular, it is not known whether perceiving and imagining food and affective stimuli elicit similar event-related potential (ERP) patterns. In this study, two ERP correlates, the early posterior negativity …

    Drawing sounds: representing tones and chords spatially

    Research on the crossmodal correspondences has revealed that seemingly unrelated perceptual information can be matched across the senses in a manner that is consistent across individuals. An interesting extension of this line of research is to study how sensory information biases action. In the present study, we investigated whether different sounds (i.e., tones and piano chords) would bias participants' hand movements in a free movement task. Right-handed participants were instructed to move a computer mouse in order to represent three tones and two chords. They also had to rate each sound on three visual analogue scales (slow-fast, unpleasant-pleasant, and weak-strong). The results demonstrate that tones and chords influence hand movements, with higher- (lower-) pitched sounds giving rise to a significant bias towards upper (lower) locations in space. These results are discussed in terms of the literature on forward models, embodied cognition, crossmodal correspondences, and mental imagery. Potential applications to sports and rehabilitation are discussed briefly.
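
    A toy sketch, on made-up data, of the kind of pitch-height check the abstract describes: compute each sound's mean vertical mouse displacement and ask whether the ordering by pitch matches the ordering by vertical bias. All trajectories, pitch values, and names below are hypothetical.

    import numpy as np

    # Hypothetical mouse trajectories: sound label -> array (n_samples, 2) of
    # (x, y) screen coordinates, with larger y meaning higher on the screen.
    trajectories = {
        "low_tone":   np.array([[0.0, 0.0], [5.0, -30.0], [10.0, -60.0]]),
        "mid_tone":   np.array([[0.0, 0.0], [5.0,   5.0], [10.0,  12.0]]),
        "high_chord": np.array([[0.0, 0.0], [5.0,  40.0], [10.0,  85.0]]),
    }
    pitch_hz = {"low_tone": 110.0, "mid_tone": 440.0, "high_chord": 880.0}  # made up

    def vertical_bias(traj):
        """Mean vertical displacement from the trajectory's starting point."""
        return float(np.mean(traj[:, 1] - traj[0, 1]))

    biases = {name: vertical_bias(traj) for name, traj in trajectories.items()}
    # Do sounds ordered by pitch come out in the same order as their vertical bias?
    same_order = sorted(pitch_hz, key=pitch_hz.get) == sorted(biases, key=biases.get)
    print(biases, same_order)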